Relevance & Research Question
When designing questionnaires, an important decision is whether or not to include a ‘do-not-know’ option. In interviews this dilemma is solved by not explicitly offering ‘do-not-know’ but accepting it when it occurs; interviewers are instructed to accept a non-substantive answer only after a gentle probe.
Online surveys, being self-administered, lack an interviewer. Web survey designers are therefore hesitant to offer an explicit do-not-know option, and ‘required answer’ is often the default in standard software. However, survey methodologists strongly advise against this forced-answer strategy: requiring an answer does not necessarily ensure that the right answer is given, and may lead to irritation and more break-offs, or to guessing and less valid answers, thereby reducing data quality.
Methods and Data
The data were collected among members of the LISS panel, a probability-based panel of the Dutch population. The questionnaire contained questions that had shown a high percentage of item nonresponse in previous self-administered surveys. A three-by-two experimental design was used. Factor A manipulated the presentation of do-not-know in three ways: not offering it explicitly, offering it visually separated from the substantive response options, and offering it as a special button. Factor B manipulated whether a do-not-know answer was accepted immediately or only after a friendly probe. Respondents were randomly assigned to the experimental conditions.
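The three-by-two factorial assignment described above can be sketched as follows. This is a minimal illustration only, not the authors' actual implementation; the condition labels and the seeded random-assignment procedure are assumptions.

```python
import random

# Hypothetical labels for the two experimental factors described above.
FACTOR_A = [
    "no_explicit_dk",         # do-not-know not offered (skipping allowed)
    "dk_visually_separated",  # do-not-know shown, visually separated from other options
    "dk_special_button",      # do-not-know offered as a special button
]
FACTOR_B = [
    "accept_dk_immediately",  # a do-not-know answer is accepted at once
    "accept_dk_after_probe",  # a friendly probe comes first
]

def assign_conditions(respondent_ids, seed=42):
    """Randomly assign each respondent to one of the 3 x 2 = 6 cells."""
    rng = random.Random(seed)
    cells = [(a, b) for a in FACTOR_A for b in FACTOR_B]
    return {rid: rng.choice(cells) for rid in respondent_ids}

assignment = assign_conditions(range(12))
```

In a real panel study the assignment would be balanced and stored with the sample file; `rng.choice` here simply illustrates independent random allocation to the six cells.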
Results
We found clear effects of both offering ‘do-not-know’ and probing. Not explicitly offering do-not-know (but allowing respondents to skip), followed by a friendly probe, resulted in the least missing information. Respondent evaluations showed that when do-not-know was offered explicitly, the questions were experienced as less difficult. When a probe was used, respondents indicated that the questions made them think more about the topic. These results suggest that offering a do-not-know option without probing gives respondents an easy escape, while probing stimulates the question-answer process. The scale reliabilities support this interpretation.
Added Value
This study adds an empirical basis to the debate on whether or not to offer do-not-know options in web surveys. We show that explicitly offering a do-not-know option in a web survey is not advisable; allowing respondents to skip a question, combined with programmed friendly probes, is a good alternative.
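The recommended strategy (no explicit do-not-know, skipping allowed, one friendly probe before a skip is accepted) could be implemented along these lines. The function name, the probe wording, and the `get_answer` callback are illustrative assumptions, not part of the study.

```python
def ask_with_probe(get_answer,
                   probe_text="We would appreciate your answer. "
                              "Could you give your best estimate?"):
    """Ask once; if the respondent skips, show one friendly probe and
    accept whatever comes back -- including a second skip. No answer
    is ever forced."""
    answer = get_answer(probe=None)      # first presentation, no probe
    if answer is None:                   # respondent skipped
        answer = get_answer(probe=probe_text)
    return answer                        # may still be None: the skip stands
```

The key design point, following the results above, is that the probe is shown exactly once and a repeated skip is then accepted, mirroring the interviewer practice of a single gentle probe rather than a forced answer.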